
    Progressive Label Distillation: Learning Input-Efficient Deep Neural Networks

    Much of the focus in the area of knowledge distillation has been on distilling knowledge from a larger teacher network to a smaller student network. However, there has been little research on how distillation can be leveraged to compress the knowledge encapsulated in the training data itself into a reduced form. In this study, we explore the concept of progressive label distillation, where we leverage a series of teacher-student network pairs to progressively generate distilled training data for learning deep neural networks with greatly reduced input dimensions. To investigate the efficacy of the proposed approach, we experimented with learning a deep limited-vocabulary speech recognition network on 500ms input utterances distilled progressively from 1000ms source training data, and demonstrated a significant increase in test accuracy of almost 78% compared to direct learning.
    Comment: 9 pages
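    As a rough illustration of the idea, here is a minimal PyTorch sketch of one way such a progressive teacher-student pipeline could be wired up. All names, the toy network shapes, the 1000 -> 750 -> 500 reduction schedule, and the softened-label KL loss are illustrative assumptions, not the paper's implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Sketch of progressive label distillation (hypothetical shapes and names).
# Each stage: a teacher trained on longer inputs soft-labels cropped inputs,
# and a student is trained on those soft labels. Chaining stages shrinks the
# input length step by step (e.g. 1000ms -> 750ms -> 500ms of features).

def make_net(input_len: int, num_classes: int = 10) -> nn.Module:
    """A toy classifier over 1-D feature vectors of length `input_len`."""
    return nn.Sequential(
        nn.Linear(input_len, 128),
        nn.ReLU(),
        nn.Linear(128, num_classes),
    )

def distill_stage(teacher, student, x_long, crop_len, steps=200, lr=1e-3, T=2.0):
    """Train `student` on cropped inputs to match the teacher's soft labels."""
    x_short = x_long[:, :crop_len]                    # reduced-dimension inputs
    with torch.no_grad():
        soft = F.softmax(teacher(x_long) / T, dim=1)  # distilled labels
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        logp = F.log_softmax(student(x_short) / T, dim=1)
        loss = F.kl_div(logp, soft, reduction="batchmean") * T * T
        loss.backward()
        opt.step()
    return student

# Placeholder data standing in for features of 1000ms utterances.
x = torch.randn(256, 1000)
teacher = make_net(1000)   # assume this was pre-trained on full-length data
for crop in [750, 500]:    # assumed progressive reduction schedule
    student = make_net(crop)
    student = distill_stage(teacher, student, x[:, : teacher[0].in_features], crop)
    teacher = student      # the student becomes the next stage's teacher
```

    In this sketch the key design choice is that each student only ever sees the truncated inputs, so by the final stage the network consumes half the original input dimension while having been supervised, transitively, by the full-length data.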

    Quantum measurement in two-dimensional conformal field theories: Application to quantum energy teleportation

    We construct a set of quasi-local measurement operators in 2D CFT and then use them to carry out the quantum energy teleportation (QET) protocol and show that it is viable. These measurement operators are built from the projectors constructed out of shadow operators, acting on the product of two spatially separated primary fields. Up to a UV-cutoff-dependent normalization, they are equivalent to the OPE blocks in the large central charge limit, but the associated outcome probabilities are UV-cutoff independent. We then adopt these quantum measurement operators to show that the QET protocol is viable in general. We also check the CHSH inequality à la OPE blocks.
    Comment: matches the version published in PLB; the main conclusion is unchanged, and some technical details can be found in the previous version
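    For orientation, the shadow-operator construction referred to above can be sketched schematically as follows. This is the textbook-level form of the projector and OPE block, with the normalization and the UV smearing (which carries the cutoff dependence the abstract mentions) suppressed; it is not the paper's precise operators:

```latex
% Projector onto the conformal family of a primary O_k with weights
% (h_k, \bar h_k), built from its shadow \tilde O_k; the normalization
% N_k and UV smearing are suppressed in this schematic.
P_k = \frac{1}{N_k} \int d^2 y \; O_k(y)\,\lvert 0\rangle\langle 0\rvert\,\tilde O_k(y),
\qquad
\tilde O_k(y) = \int d^2 y' \,
  \frac{O_k(y')}{(y - y')^{2(1-h_k)}\,(\bar y - \bar y')^{2(1-\bar h_k)}} .

% Acting on a product of spatially separated primaries, the projector
% picks out (up to the OPE coefficient C_{\phi\phi k}) the OPE block
% \mathcal{B}_k(x_1, x_2), which at large central charge reduces to the
% quasi-local measurement operators used in the QET protocol.
P_k \, \phi(x_1)\,\phi(x_2) \;\sim\; C_{\phi\phi k}\, \mathcal{B}_k(x_1, x_2).
```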

    Basis Expansions for Functional Snippets

    Estimation of mean and covariance functions is fundamental for functional data analysis. While this topic has been studied extensively in the literature, a key assumption is that there are enough data in the domain of interest to estimate both the mean and covariance functions. In this paper, we investigate mean and covariance estimation for functional snippets, in which observations from a subject are available only on an interval of length strictly (and often much) shorter than the length of the whole interval of interest. Under such a sampling plan, no data are available for direct estimation of the off-diagonal region of the covariance function. We tackle this challenge via a basis representation of the covariance function. The proposed approach allows one to consistently estimate an infinite-rank covariance function from functional snippets. We establish convergence rates for the proposed estimators and illustrate their finite-sample performance via simulation studies and two data applications.
    Comment: 51 pages, 10 figures
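    To make the basis-representation idea concrete, here is a minimal NumPy sketch: the covariance is expanded as C(s,t) ≈ Σ_{j,k} b_{jk} φ_j(s) φ_k(t), the coefficient matrix is fit by least squares on within-snippet raw covariances, and the fitted expansion then extrapolates to the unobserved off-diagonal region. The Fourier basis, the snippet generator, and all names are illustrative assumptions, not the paper's estimator:

```python
import numpy as np

# Sketch of covariance estimation from functional snippets via a basis
# expansion C(s,t) ~ sum_{j,k} b_{jk} phi_j(s) phi_k(t). The Fourier basis
# below is an assumed choice; any basis on [0, 1] would play the same role.

def fourier_basis(t, K):
    """First K Fourier basis functions on [0, 1], evaluated at points t."""
    cols = [np.ones_like(t)]
    for k in range(1, (K + 1) // 2 + 1):
        cols.append(np.sqrt(2) * np.cos(2 * np.pi * k * t))
        cols.append(np.sqrt(2) * np.sin(2 * np.pi * k * t))
    return np.column_stack(cols)[:, :K]

def fit_covariance(snippets, K=5):
    """Least-squares fit of the coefficient matrix B from within-snippet
    raw covariances (Y_i(s) - mu(s))(Y_i(t) - mu(t)); mean assumed removed."""
    rows, targets = [], []
    for t_obs, y in snippets:
        Phi = fourier_basis(t_obs, K)          # shape (m_i, K)
        for a in range(len(t_obs)):
            for b in range(len(t_obs)):
                if a == b:
                    continue                   # diagonal is inflated by noise
                rows.append(np.kron(Phi[a], Phi[b]))  # phi(s) (x) phi(t)
                targets.append(y[a] * y[b])
    b_vec, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(targets), rcond=None)
    B = b_vec.reshape(K, K)
    return (B + B.T) / 2                       # enforce symmetry

def eval_covariance(B, grid):
    Phi = fourier_basis(grid, B.shape[0])
    return Phi @ B @ Phi.T                     # covers off-diagonal (s,t) too

# Toy snippets: each subject is observed only on a short subinterval of [0, 1].
rng = np.random.default_rng(0)
snippets = []
for _ in range(50):
    start = rng.uniform(0, 0.6)
    t_obs = np.sort(rng.uniform(start, start + 0.4, size=6))
    y = np.sin(2 * np.pi * t_obs) * rng.normal() + 0.1 * rng.normal(size=6)
    snippets.append((t_obs, y))                # pretend y is mean-centered
B = fit_covariance(snippets, K=5)
C = eval_covariance(B, np.linspace(0, 1, 21))  # full grid, incl. |s - t| large
```

    The point of the sketch is the last line: even though no snippet ever supplies a pair (s, t) with |s - t| larger than the snippet length, the fitted finite-rank expansion is defined on the whole square, which is how the basis approach sidesteps the missing off-diagonal data.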